85 research outputs found

    Black Hole Models of Quasars

    Observations of active galactic nuclei are interpreted in terms of a theoretical model involving accretion onto a massive black hole. Optical quasars and Seyfert galaxies are associated with holes accreting near the Eddington rate, and radio galaxies with sub-critical accretion. It is argued that magnetic fields are largely responsible for extracting energy and angular momentum from black holes and disks. Recent studies of electron-positron pair plasmas and their possible role in establishing the emergent X-ray spectrum are reviewed. The main evolutionary properties of active galactic nuclei can be interpreted in terms of a simple model in which black holes accrete gas at a rate dictated by the gas supply, which decreases with cosmic time. It may be worth searching for eclipsing binary black holes in lower-power Seyferts.
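    For reference, the "Eddington rate" invoked above corresponds to the standard Eddington luminosity, at which radiation pressure on free electrons balances gravity for spherical accretion (a textbook expression, not taken from this abstract):

    ```latex
    L_{\mathrm{Edd}} = \frac{4\pi G M m_{p} c}{\sigma_{T}}
    \simeq 1.26 \times 10^{38} \left(\frac{M}{M_{\odot}}\right)\ \mathrm{erg\,s^{-1}}
    ```

    For the ~10^8 solar-mass holes typically invoked for quasars this gives ~10^46 erg/s, comparable to observed quasar luminosities.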

    Explosive Nucleosynthesis: What we learned and what we still do not understand

    This review touches on historical aspects, going back to the early days of nuclear astrophysics initiated by B²FH and Cameron; discusses (i) the required nuclear input, from reaction rates and decay properties up to the nuclear equation of state; continues (ii) with the tools to perform nucleosynthesis calculations and (iii) early parametrized nucleosynthesis studies, before (iv) reliable stellar models became available for the late stages of stellar evolution. It then passes through (v) explosive environments, from core-collapse supernovae to explosive events in binary systems (including type Ia supernovae and compact binary mergers), and finally (vi) discusses the role of all these nucleosynthesis production sites in the evolution of galaxies. The focus is on the comparison of early ideas with the present, very recent, understanding. Comment: 11 pages, to appear in Springer Proceedings in Physics (Proc. of Intl. Conf. "Nuclei in the Cosmos XV", LNGS Assergi, Italy, June 2018)

    Radio emission from Supernova Remnants

    The explosion of a supernova releases almost instantaneously about 10^51 ergs of mechanical energy, irreversibly changing the physical and chemical properties of large regions in their host galaxies. The stellar ejecta, the nebula resulting from the powerful shock waves, and sometimes a compact stellar remnant constitute a supernova remnant (SNR). They can radiate their energy across the whole electromagnetic spectrum, but the great majority are radio sources. Almost 70 years after the first detection of radio emission from an SNR, great progress has been achieved in understanding their physical characteristics and evolution. We review the present knowledge of different aspects of radio remnants, focusing on sources in the Milky Way and the Magellanic Clouds, where SNRs can be spatially resolved. We present a brief overview of the theoretical background, analyze morphology and polarization properties, and critically discuss the different methods applied to determine the radio spectrum and distances. The consequences of the interaction between SNR shocks and the surrounding medium are examined, including the question of whether SNRs can trigger the formation of new stars. Cases of multispectral comparison are presented. A section is devoted to recent results on radio SNRs in the Magellanic Clouds, with particular emphasis on the radio properties of SN 1987A, an ideal laboratory to investigate the dynamical evolution of an SNR in near real time. The review concludes with a summary of issues on radio SNRs that deserve further study and an analysis of the prospects for future research with the latest generation of radio telescopes. Comment: Revised version. 48 pages, 15 figures
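    As a reminder of the convention behind the "radio spectrum" determinations mentioned above: synchrotron emission from shock-accelerated electrons with an energy power-law index s yields a power-law flux density (a standard result, not specific to this review):

    ```latex
    S_{\nu} \propto \nu^{-\alpha}, \qquad \alpha = \frac{s - 1}{2}
    ```

    so the test-particle prediction s ≈ 2 of diffusive shock acceleration gives α ≈ 0.5, close to the spectral indices measured for many shell-type remnants.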

    Discovery of Radio Emission from the Brown Dwarf LP944-20

    Brown dwarfs are classified as objects that are not massive enough to sustain nuclear fusion of hydrogen and are distinguished from planets by their ability to burn deuterium. Old (>10 Myr) brown dwarfs are expected to possess short-lived magnetic fields and, since they no longer generate energy from collapse and accretion, weak radio- and X-ray-emitting coronae. Several efforts have been undertaken in the past to detect chromospheric activity from the brown dwarf LP944-20 at X-ray and optical wavelengths, but only recently was an X-ray flare from this object detected. Here we report the discovery of quiescent and flaring radio emission from this source, which represents the first detection of persistent radio emission from a brown dwarf, with luminosities several orders of magnitude larger than predicted by an empirical relation between the X-ray and radio luminosities of many stellar types. We show, in the context of synchrotron emission, that LP944-20 possesses an unusually weak magnetic field in comparison to active M dwarf stars, which might explain the null results from previous optical and X-ray observations of this source and the deviation from the empirical relations. Comment: Accepted to Nature
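    The empirical X-ray/radio relation referred to above is commonly quoted as the Güdel-Benz relation for magnetically active stars (our identification, not named in the abstract), relating X-ray luminosity to radio spectral luminosity:

    ```latex
    \frac{L_{X}}{L_{\nu,\mathrm{rad}}} \approx 10^{15.5}\ \mathrm{Hz}
    ```

    The quiescent radio luminosity of LP944-20 exceeds the value this relation predicts from its X-ray luminosity by several orders of magnitude, which is what makes the detection remarkable.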

    Galactic Effects on Habitability

    The galactic environment has been suspected to influence planetary habitability in many ways. Very metal-poor regions of the Galaxy, or those largely devoid of elements heavier than H and He, are thought to be unable to form habitable planets. Moreover, if such planets do form, the young system is subjected to close stellar passages while it resides in its stellar birth cluster. Various potential hazards remain after clusters disperse. For instance, central galactic regions may present risks to habitability via nearby supernovae, gamma ray bursts (GRBs), and frequent comet showers. In addition, planets residing within very wide binary star systems are affected by the Galaxy, as local gravitational perturbations from the Galaxy can increase the binary's eccentricity until it destabilizes the planets it hosts. Here we review the most recent work on the main galactic influences over planetary habitability. Although there must be some metallicity limit below which rocky planets cannot form, recent exoplanet surveys show that they form around stars with a very large range of metallicities. Once formed, the probability of star clusters destabilizing planetary systems only becomes high for rare, extremely long-lived clusters. Regarding threats to habitability from supernovae, GRBs, and comet showers, many recent studies suggest that their hazards are more limited than originally thought. Finally, denser regions of the Galaxy enhance the threat that very wide binary companions pose to planetary habitability, but the probability that a very wide binary star disrupts habitability will always be substantially below 100% for any environment. 
    While some Milky Way regions must be more hospitable to habitable planets than others, it is difficult to state that habitable planets are confined to any well-defined region of the Galaxy or that any other particular region of the Galaxy is uninhabitable. Comment: Invited review chapter, accepted for publication in the "Handbook of Exoplanets"; 19 pages; 2 figures

    Effectiveness of Biodiversity Surrogates for Conservation Planning: Different Measures of Effectiveness Generate a Kaleidoscope of Variation

    Conservation planners represent many aspects of biodiversity by using surrogates with spatial distributions that are readily observed or quantified, but tests of their effectiveness have produced varied and conflicting results. We identified four factors likely to have a strong influence on the apparent effectiveness of surrogates: (1) the choice of surrogate; (2) differences among study regions, which might be large and unquantified; (3) the test method, that is, how effectiveness is quantified; and (4) the test features that the surrogates are intended to represent. Analysis of an unusually rich dataset enabled us, for the first time, to disentangle these factors and to compare their individual and interacting influences. Using two data-rich regions, we estimated effectiveness with five alternative methods: two forms of incidental representation, two forms of species accumulation index, and irreplaceability correlation, to assess the performance of ‘forest ecosystems’ and ‘environmental units’ as surrogates for six groups of threatened species (the test features): mammals, birds, reptiles, frogs, plants, and all of these combined. Four methods tested the effectiveness of the surrogates by selecting areas for conservation of the surrogates and then estimating how effective those areas were at representing the test features. One method measured the spatial match between conservation priorities for surrogates and test features. For the methods that selected conservation areas, we measured effectiveness in two ways: (1) when representation targets for the surrogates were achieved (incidental representation), or (2) progressively as areas were selected (species accumulation index). We estimated the spatial correlation of conservation priorities using an index known as summed irreplaceability. In general, the effectiveness of surrogates for our taxa (mostly threatened species) was low, although environmental units tended to be more effective than forest ecosystems. 
    The surrogates were most effective for plants and mammals and least effective for frogs and reptiles. The five testing methods differed in their rankings of the effectiveness of the two surrogates in relation to different groups of test features. There were also differences between study areas in the effectiveness of surrogates for different test-feature groups. Overall, the effectiveness of the surrogates was sensitive to all four factors, which indicates the need for caution in generalizing surrogacy tests.

    Gravitational-wave research as an emerging field in the Max Planck Society. The long roots of GEO600 and of the Albert Einstein Institute

    On the occasion of the 50th anniversary of the beginning of the search for gravitational waves at the Max Planck Society, and in coincidence with the 25th anniversary of the foundation of the Albert Einstein Institute, we explore the interplay between the renaissance of general relativity and the advent of relativistic astrophysics, following the early German involvement in gravitational-wave research to the point when gravitational-wave detection became established through full-scale detectors and international collaborations. Against the background of the spectacular astrophysical discoveries of the 1960s and the growing role of relativistic astrophysics, Ludwig Biermann and his collaborators at the Max Planck Institute for Astrophysics in Munich became deeply involved in research related to such new horizons. At the end of the 1960s, Joseph Weber's announcements claiming detection of gravitational waves sparked the decisive entry of this group into the field, in parallel with the appointment of the renowned relativist Juergen Ehlers. The Munich-area group of Max Planck institutes provided the fertile ground for acquiring a leading position in the 1970s, facilitating the experimental transition from resonant bars towards laser interferometry and its innovation at increasingly large scales, eventually moving to a dedicated site in Hannover in the early 1990s. The Hannover group emphasized perfecting experimental systems at pilot scales and never developed a full-sized detector, instead joining the LIGO Scientific Collaboration at the end of the century. In parallel, the Max Planck Institute for Gravitational Physics (Albert Einstein Institute) had been founded in Potsdam, and both sites, in Hannover and Potsdam, became a unified entity in the early 2000s and were central contributors to the first detection of gravitational waves in 2015. Comment: 94 pages. Enlarged version including new results from further archival research. 
    A previous version appears as a chapter in the volume The Renaissance of General Relativity in Context, edited by A. Blum, R. Lalli and J. Renn (Boston: Birkhauser, 2020)

    svclassify: a method to establish benchmark structural variant calls

    The human genome contains variants ranging in size from small single nucleotide polymorphisms (SNPs) to large structural variants (SVs). High-quality benchmark small-variant calls for the pilot National Institute of Standards and Technology (NIST) Reference Material (NA12878) have been developed by the Genome in a Bottle Consortium, but no similar high-quality benchmark SV calls exist for this genome. Since SV callers output highly discordant results, we developed methods to combine multiple forms of evidence from multiple sequencing technologies to classify candidate SVs into likely true or false positives. Our method (svclassify) calculates annotations from one or more aligned BAM files from many high-throughput sequencing technologies, and then builds a one-class model using these annotations to classify candidate SVs as likely true or false positives. We first used pedigree analysis to develop a set of high-confidence breakpoint-resolved large deletions. We then used svclassify to cluster and classify these deletions as well as a set of high-confidence deletions from the 1000 Genomes Project and a set of breakpoint-resolved complex insertions from Spiral Genetics. We find that likely SVs cluster separately from likely non-SVs based on our annotations, and that the SVs cluster into different types of deletions. We then developed a supervised one-class classification method that uses a training set of random non-SV regions to determine whether candidate SVs have abnormal annotations different from most of the genome. To test this classification method, we used our pedigree-based breakpoint-resolved SVs, SVs validated by the 1000 Genomes Project, and assembly-based breakpoint-resolved insertions, along with semi-automated visualization using svviz. 
    We find that candidate SVs with high scores from multiple technologies have high concordance with PCR validation and with an orthogonal consensus method, MetaSV (99.7% concordant), and that candidate SVs with low scores are questionable. We distribute a set of 2676 high-confidence deletions and 68 high-confidence insertions with high svclassify scores from these call sets for benchmarking SV callers. We expect these methods to be particularly useful for establishing high-confidence SV calls for benchmark samples that have been characterized by multiple technologies. https://doi.org/10.1186/s12864-016-2366-
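    The one-class idea described above can be sketched in a few lines: fit per-annotation statistics on random non-SV training regions, then flag candidates whose annotations are strong outliers relative to that background. This is a minimal illustration with invented annotation names and thresholds, not the published svclassify implementation, which uses richer annotations from aligned BAM files and a proper one-class model:

    ```python
    # Minimal one-class classification sketch: learn what "normal genome"
    # annotations look like, then flag candidates that deviate strongly.
    # Annotation names ("coverage", "mapq") and the z-score cut are illustrative.
    from statistics import mean, stdev

    def fit_background(training_annotations):
        """Per-annotation (mean, stdev) from random non-SV training regions."""
        keys = training_annotations[0].keys()
        return {k: (mean([r[k] for r in training_annotations]),
                    stdev([r[k] for r in training_annotations]))
                for k in keys}

    def classify_candidate(candidate, background, z_threshold=3.0):
        """Call a candidate a likely SV if any annotation is a strong outlier."""
        for k, (mu, sd) in background.items():
            if sd > 0 and abs(candidate[k] - mu) / sd > z_threshold:
                return "likely_SV"
        return "likely_non_SV"
    ```

    A deletion, for example, would show anomalously low read depth over the candidate interval, so its "coverage" annotation falls far outside the background distribution and the candidate is flagged; the real tool combines many such annotations across technologies rather than a single z-score cut.
    
    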

    Flexible Manufacturing Systems: background examples and models

    In this paper, we discuss recent innovations in manufacturing technology and their implications for the design and control of manufacturing systems. Recognizing the need to respond properly to rapidly changing market demands, we discuss several types of flexibility that can be incorporated in a production organisation to achieve this goal. We show how the concept of a Flexible Manufacturing System (FMS) naturally arises as an attempt to combine the advantages of traditional job shops and dedicated production lines. The main body of the paper is devoted to a classification of FMS problem areas and a review of models developed to understand and solve these problems. For each problem area, a number of important contributions in the literature are indicated. Readers interested in the applications of Operations Research models but not familiar with the technical background of FMSs will find the descriptions of some essential FMS elements useful. Some final remarks and directions for future research conclude the paper.